# Pre-trained Models

| Model | Author | License | Tags | Downloads | Likes | Description |
|---|---|---|---|---|---|---|
| Pi0 Pre Train 100 | Ziang-Li | Apache-2.0 | Multimodal Fusion, Safetensors | 136 | 0 | A robot policy model trained with the LeRobot framework, suited to robot control tasks. |
| Qwen3 14b Ug40 Pretrained | jq | – | Large Language Model, Transformers | 1,757 | 1 | An automatically generated Transformers model card; specific model details are missing. |
| Openba V1 Based | OpenNLG | Apache-2.0 | Large Language Model, Transformers, Multilingual | 94 | 10 | OpenBA, an open-source 15-billion-parameter bilingual asymmetric sequence-to-sequence model pre-trained from scratch. |
| Iiihi24 | Diamantis99 | MIT | Image Segmentation | 115 | 0 | A PyTorch-based semantic segmentation model supporting a variety of encoder architectures. |
| Oxford Pet Segmentation | SimonLiao | MIT | Image Segmentation | 53 | 0 | A PyTorch-based FPN image segmentation model supporting a variety of encoder architectures. |
| T5 Typo Correction V3 | Wguy | – | Large Language Model, Transformers | 303 | 1 | An automatically generated Transformers model card; specific details are to be supplied. |
| Llama3 1 Relevance Dev | qqlabs | – | Large Language Model, Transformers | 7,149 | 1 | An automatically generated Transformers model card; specific details are to be supplied. |
| Medgpt | Rabbiaaa | – | Large Language Model, Transformers | 1,603 | 1 | A Transformers model released on Hugging Face; its functions and uses are to be supplied. |
| Vesselfm | bwittmann | Other | Image Segmentation | 153 | 4 | VesselFM, a foundation model for universal 3D vascular segmentation in any imaging domain. |
| Speecht5 Base Cs Tts | fav-kky | – | Speech Synthesis, Transformers, Other | 66 | 0 | A monolingual Czech SpeechT5 base model pre-trained on 120,000 hours of Czech audio and a 17.5-billion-word text corpus, intended as a starting point for Czech TTS fine-tuning. |
| Cubert 20210711 Python 1024 | claudios | Apache-2.0 | Large Language Model, Transformers, Other | 22 | 1 | CuBERT, a contextual embedding model for Python source code, designed for source-code analysis tasks. |
| Prometheus Bgb 8x7b V2.0 | prometheus-eval | – | Large Language Model, Transformers | 772 | 6 | A Transformers model hosted on the Hugging Face Hub; no specific functionality or purpose is stated. |
| Tinysolar 248m 4k | upstage | Apache-2.0 | Large Language Model, Transformers | 284 | 7 | – |
| Speech Accent Classification | dima806 | Apache-2.0 | Audio Classification, Transformers, English | 40 | 4 | A speech model based on the Wav2Vec2 architecture, trained on 960 hours of English speech and suited to speech classification tasks. |
| Emotion English | jitesh | MIT | Text Classification, Transformers, Multilingual | 211.18k | 7 | An emotion classification model for English text that identifies 20 distinct emotional states. |
| Testmodel | changsu | OpenRAIL | Image Classification, Transformers | 33 | 0 | An image classification model hosted on Hugging Face that recognizes common objects such as animals and everyday items. |
| Ivenpeople V2 | jctivensa | – | Image Classification, Transformers | 19 | 0 | A vision image classification model trained on the ImageNet-1k dataset that recognizes many common object categories. |
| Randeng Deltalm 362M En Zh | IDEA-CCNL | – | Machine Translation, Transformers, Multilingual | 259 | 23 | A model fine-tuned from the DeltaLM base within the Fengshen framework, combining Chinese-English datasets with the IWSLT Chinese-English parallel corpus for English-to-Chinese translation. |
| Comida Vgm | api19750904 | – | Image Classification, Transformers | 11 | 0 | An image classification model built with PyTorch and HuggingPics, designed for food classification. |
| Trocr Large Str | microsoft | – | Text Recognition, Transformers | 571 | 17 | TrOCR, a Transformer-based optical character recognition model for single-line text images, fine-tuned on several standard benchmarks. |
| Vit5 Base Vietnews Summarization | VietAI | MIT | Text Generation, Other | 1,145 | 7 | A text summarization model fine-tuned from ViT5-Base on Vietnamese news datasets for automatic summarization of Vietnamese text. |
| Hgn Trans En2zh | BubbleSheep | Apache-2.0 | Machine Translation, Transformers, Multilingual | 1,069 | 3 | An English-to-Chinese translation model fine-tuned from the Helsinki-NLP/opus-mt-en-zh pre-trained model on the Tsinghua Open Chinese Lexicon (THUOCL) dataset. |
| Bert Tiny Uncased | gaunernst | Apache-2.0 | Large Language Model, Transformers, English | 3,297 | 4 | A tiny, case-insensitive version of BERT, suited to NLP tasks in resource-constrained environments. |
| Kosimcse Bert Multitask | BM-K | – | Text Embedding, Transformers, Korean | 827 | 8 | KoSimCSE-BERT-multitask, a high-performance Korean sentence-embedding model based on the BERT architecture and optimized with a multi-task learning strategy. |
| Kosimcse Roberta | BM-K | – | Text Embedding, Transformers, Korean | 10.35k | 18 | A Korean sentence-embedding model based on the RoBERTa architecture, optimized for sentence representation through contrastive learning; suited to tasks such as semantic-similarity computation. |
| Roomclassifier | lazyturtl | – | Image Classification, Transformers | 403 | 16 | A PyTorch-based image classification model that accurately identifies different room types from photos. |
| Bert Base Uncased | OWG | Apache-2.0 | Large Language Model, Transformers, English | 15 | 0 | A case-insensitive English BERT base model pre-trained with the masked language modeling (MLM) objective. |
| Bert Base Dutch Cased | wietsedv | – | Large Language Model | 32.50k | 2 | A Dutch pre-trained BERT model developed at the University of Groningen, suited to a range of Dutch NLP tasks. |
| Trocr Large Stage1 | microsoft | – | Text Recognition, Transformers | 3,700 | 25 | TrOCR, a Transformer-based pre-trained model for optical character recognition (OCR) tasks. |
| Roberta Large | klue | – | Large Language Model, Transformers, Korean | 132.29k | 50 | A RoBERTa model pre-trained on Korean, suited to Korean language-understanding tasks. |
| Nli MiniLM2 L6 H768 | cross-encoder | Apache-2.0 | Text Classification, Transformers, English | 10.02k | 9 | A pre-trained natural language inference model based on the MiniLMv2 architecture that classifies sentence pairs as contradiction, entailment, or neutral. |
© 2025 AIbase